Search results for "Data interpretation"
Showing 10 of 195 documents
Diode laser spectroscopy of the ν8 band of the SF5Cl molecule.
2003
Abstract: Diode laser spectra of SF5Cl have been recorded in the ν8 band region at a temperature of ca. 240 K, a pressure of 0.25 mbar, and an instrumental bandwidth of ca. 0.001 cm−1. Four regions have been studied: one in the P-branch (906.849–907.687 cm−1), one in the Q-branch (910.407–910.944 cm−1), and two in the R-branch (913.957–914.556 and 917.853–918.705 cm−1). The whole ν1/ν8 dyad of SF5(35)Cl had previously been recorded by the group of Professor H. Bürger in Wuppertal using a Fourier transform infrared spectrometer [J. Mol. Spectrosc. 208 (2001) 169]. These data have thus been combined with our diode laser data with the aim of refi…
Deterministic chaos and the first positive Lyapunov exponent: a nonlinear analysis of the human electroencephalogram during sleep
1993
Under selected conditions, nonlinear dynamical systems, which can be described by deterministic models, are able to generate so-called deterministic chaos. In this case the dynamics show a sensitive dependence on initial conditions, which means that different states of a system, being arbitrarily close initially, will become macroscopically separated for sufficiently long times. In this sense, the unpredictability of the EEG might be a basic phenomenon of its chaotic character. Recent investigations of the dimensionality of EEG attractors in phase space have led to the assumption that the EEG can be regarded as a deterministic process which should not be mistaken for simple noise. The calcu…
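For context (not stated in the abstract): the "sensitive dependence on initial conditions" it describes is conventionally quantified by the first (largest) Lyapunov exponent, whose textbook definition is sketched below.

```latex
% Largest Lyapunov exponent \lambda_1: d(0) is the initial separation of two
% nearby states and d(t) their separation after time t; chaos corresponds to
% exponential divergence, d(t) \approx d(0)\, e^{\lambda_1 t} with \lambda_1 > 0,
% while \lambda_1 \le 0 is consistent with periodic or fixed-point dynamics.
\[
  \lambda_1 = \lim_{t \to \infty}\,\lim_{d(0) \to 0}\, \frac{1}{t}\,\ln\frac{d(t)}{d(0)}
\]
```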
Calculation of NNTs in RCTs with time-to-event outcomes: A literature review
2008
Abstract. Background: The number needed to treat (NNT) is a well-known effect measure for reporting the results of clinical trials. In the case of time-to-event outcomes, the calculation of NNTs is more difficult than in the case of binary data. The frequency of using NNTs to report results of randomised controlled trials (RCTs) investigating time-to-event outcomes, and the adequacy of the applied calculation methods, are unknown. Methods: We searched PubMed for RCTs with parallel group design and individual randomisation, published in four frequently cited journals between 2003 and 2005. We evaluated the type of outcome, the frequency of reporting NNTs with corresponding confidence intervals,…
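As background for the calculation issue the abstract raises (the formulas are not reproduced there): for binary outcomes the NNT is the reciprocal of the absolute risk reduction, while for time-to-event outcomes a standard approach (e.g., that of Altman and Andersen) evaluates the Kaplan–Meier survival curves of the two arms at a fixed time point.

```latex
% Binary outcome: NNT is the reciprocal of the absolute risk reduction (ARR).
\[
  \mathrm{NNT} = \frac{1}{\mathrm{ARR}}
               = \frac{1}{p_{\mathrm{control}} - p_{\mathrm{treatment}}}
\]
% Time-to-event outcome: compare Kaplan--Meier survival estimates \hat S(t)
% at a chosen time point t; the resulting NNT depends on that choice of t.
\[
  \mathrm{NNT}(t) = \frac{1}{\hat S_{\mathrm{treatment}}(t) - \hat S_{\mathrm{control}}(t)}
\]
```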
Activity of O6-methylguanine DNA methyltransferase in mononuclear blood cells of formaldehyde-exposed medical students
1999
A recent study reported that exposure of student embalmers in Cincinnati to high concentrations of formaldehyde (2 mg/m³) reduced the activity of the DNA repair protein O6-methylguanine DNA methyltransferase (MGMT). A reduction in a DNA repair enzyme may strongly increase cancer risk, not only with respect to the agent reducing the repair enzyme but with respect to all carcinogens causing lesions that are subject to repair by the enzyme in question. We therefore examined whether formaldehyde exposure of 57 medical students during their anatomy course at two different universities in Germany influenced MGMT activity in mononuclear blood cells. Mean formaldehyde exposure of 41 students was 0.2 ± 0.05 mg/m³ …
Coupled variable selection for regression modeling of complex treatment patterns in a clinical cancer registry.
2013
For determining a manageable set of covariates potentially influential with respect to a time-to-event endpoint, Cox proportional hazards models can be combined with variable selection techniques, such as stepwise forward selection or backward elimination based on p-values, or regularized regression techniques such as component-wise boosting. Cox regression models have also been adapted for dealing with more complex event patterns, for example, for competing risks settings with separate, cause-specific hazard models for each event type, or for determining the prognostic effect pattern of a variable over different landmark times, with one conditional survival model for each landmark. Motivat…
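As an illustration of one of the baseline strategies this abstract names, p-value-based stepwise forward selection for a Cox proportional hazards model, here is a hedged Python sketch using the lifelines package. The function name, column handling, and threshold are my own for illustration, not the paper's method, which couples selection across several models.

```python
from lifelines import CoxPHFitter

def forward_select_cox(df, duration_col, event_col, candidates, alpha=0.05):
    """Stepwise forward selection by p-value for a single Cox model.
    Illustrative sketch only: hypothetical column names, greedy inclusion,
    and no correction for multiple testing."""
    selected = []
    while True:
        best_p, best_var = alpha, None
        for var in candidates:
            if var in selected:
                continue
            cph = CoxPHFitter()
            cph.fit(df[[duration_col, event_col] + selected + [var]],
                    duration_col=duration_col, event_col=event_col)
            p = cph.summary.loc[var, "p"]       # Wald p-value of the candidate
            if p < best_p:
                best_p, best_var = p, var
        if best_var is None:                     # no candidate passes the threshold
            return selected
        selected.append(best_var)
```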
Inferential tools in penalized logistic regression for small and sparse data: A comparative study.
2016
This paper focuses on inferential tools in the logistic regression model fitted by the Firth penalized likelihood. In this context, the Likelihood Ratio statistic is often reported to be the preferred choice over the ‘traditional’ Wald statistic. In this work, we consider and discuss a wider range of test statistics, including the robust Wald, the Score, and the recently proposed Gradient statistic. We compare all of these asymptotically equivalent statistics in terms of interval estimation and hypothesis testing via simulation experiments and analyses of two real datasets. We find that the Likelihood Ratio statistic does not appear to be the best inferential device in the Firth penal…
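For reference, the Firth penalty and the penalized likelihood ratio statistic under discussion take the following standard form (not reproduced in the abstract).

```latex
% Firth's penalized log-likelihood: the Jeffreys-prior penalty (half the log
% determinant of the Fisher information I(\beta)) reduces small-sample bias
% and keeps estimates finite even under separation.
\[
  \ell^{*}(\beta) = \ell(\beta) + \tfrac{1}{2}\,\log \lvert I(\beta)\rvert
\]
% Penalized likelihood ratio statistic for H_0\colon \beta_j = 0, with
% \hat\beta the unrestricted and \hat\beta_0 the restricted maximizer:
\[
  \mathrm{LR} = 2\,\bigl[\ell^{*}(\hat\beta) - \ell^{*}(\hat\beta_0)\bigr],
  \quad \text{asymptotically } \chi^2_1 \text{ under } H_0 .
\]
```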
Treating missing data in a clinical neuropsychological dataset – data imputation.
2001
Missing data frequently reduce the applicability of clinically collected data in research requiring multivariate statistics. In data imputation, missing values are replaced by predicted values obtained from models based on auxiliary information. Our aim was to complete a clinical child neuropsychological data set containing 5.2% missing observations, for use in research requiring multivariate statistics. We compared four data imputation methods by artificially deleting some data. A real-donor imputation method, which preserved the parameter estimates and predicted the observed values with acceptable accuracy, was used to complete the data set. In addressing the lack of st…
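A minimal sketch of real-donor (hot-deck) imputation in the nearest-neighbour spirit the abstract describes, assuming numeric data, at least one complete case, and Euclidean distance on the jointly observed variables. The function is illustrative, not the authors' implementation.

```python
import numpy as np

def real_donor_impute(X):
    """Each missing value is copied from the nearest complete case ("real
    donor"), matched on the variables the recipient has observed. Sketch
    only; scaling, donor pools, and ties need more care in practice."""
    X = np.asarray(X, dtype=float)
    out = X.copy()
    donors = X[~np.isnan(X).any(axis=1)]             # complete cases only
    for i in np.where(np.isnan(X).any(axis=1))[0]:
        obs = ~np.isnan(X[i])                        # variables observed for recipient i
        d = np.linalg.norm(donors[:, obs] - X[i, obs], axis=1)
        donor = donors[np.argmin(d)]                 # nearest real donor
        out[i, ~obs] = donor[~obs]                   # copy the donor's values
    return out

X = [[1.0, 2.0, 3.0],
     [1.1, np.nan, 2.9],
     [5.0, 6.0, 7.0]]
print(real_donor_impute(X))  # missing entry filled from the nearest complete row
```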
Weighted Least-Squares Likelihood Ratio Test for Branch Testing in Phylogenies Reconstructed from Distance Measures
2005
A variety of analytical methods is available for branch testing in distance-based phylogenies. However, these methods are rarely used, possibly because the estimation of some of their statistics, especially the covariances, is not always feasible. We show that these difficulties can be overcome if some simplifying assumptions are made, namely distance independence. The weighted least-squares likelihood ratio test (WLS-LRT) we propose is easy to perform, using only the distances and some of their associated variances. If no variances are known, the use of the Felsenstein F-test, also based on weighted least squares, is discussed. Using simulated data and a data set of 43 mammalian mitochondr…
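Schematically, and under the Gaussian, known-variance, independence assumptions the abstract invokes, the weighted least-squares criterion and the branch test look as follows; this is a sketch of the idea, not the paper's exact statistic.

```latex
% Weighted least-squares fit of a tree T to pairwise distance estimates d_{ij},
% where \delta_{ij}(T) is the path length between taxa i and j on T and the
% weights are inverse variances, w_{ij} = 1/\widehat{\operatorname{Var}}(d_{ij}).
\[
  Q(T) = \sum_{i<j} w_{ij}\,\bigl(d_{ij} - \delta_{ij}(T)\bigr)^{2}
\]
% With independent Gaussian errors and known variances, Q equals minus twice
% the log-likelihood up to a constant, so constraining an internal branch b
% to length zero yields a likelihood-ratio-type statistic
\[
  \mathrm{LRT} = Q(T_{b=0}) - Q(T),
\]
% referred asymptotically to a \chi^2 distribution under the null hypothesis.
```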
Efficient estimation of generalized linear latent variable models.
2019
Generalized linear latent variable models (GLLVMs) are popular tools for modeling multivariate, correlated responses. Such data are often encountered, for instance, in ecological studies, where presence-absences, counts, or biomass of interacting species are collected from a set of sites. Until very recently, the main challenge in fitting GLLVMs has been the lack of computationally efficient estimation methods. For likelihood-based estimation, several closed-form approximations for the marginal likelihood of GLLVMs have been proposed, but their efficient implementations have been lacking in the literature. To fill this gap, we show in this paper how to obtain computationally convenient estim…
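For orientation, the GLLVM structure being fitted is typically written as follows; this is the standard formulation, not quoted from the paper.

```latex
% GLLVM for response y_{ij} of species j at site i: conditional on a
% d-dimensional latent vector u_i (d small), responses are independent and
% exponential-family distributed, with mean linked through g:
\[
  g\bigl(\mathbb{E}[y_{ij} \mid \mathbf{u}_i]\bigr)
  = \beta_{0j} + \mathbf{x}_i^{\top}\boldsymbol{\beta}_j
    + \mathbf{u}_i^{\top}\boldsymbol{\lambda}_j,
  \qquad \mathbf{u}_i \sim N(\mathbf{0}, \mathbf{I}_d),
\]
% where \boldsymbol{\lambda}_j are factor loadings; the marginal likelihood
% integrates over u_i, which is where closed-form (e.g. Laplace or
% variational) approximations enter.
```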
Stochastic Nonlinear Time Series Forecasting Using Time-Delay Reservoir Computers: Performance and Universality
2014
Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computer called time-delay reservoirs, which are constructed by sampling the solution of a time-delay differential equation, and show their good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting actual daily market realized volatilities computed from intraday quotes, using daily log-return series of moderate size as training input. We …
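To make the reservoir-computing idea concrete, here is a minimal echo state network sketch in Python: a fixed random recurrent reservoir expands the input nonlinearly, and only a linear readout is trained, here by ridge regression. The paper's time-delay reservoirs are a specific hardware-oriented variant; this generic toy example (synthetic sine data, arbitrary hyperparameters) is for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, leak, ridge = 200, 0.3, 1e-6

u = np.sin(0.3 * np.arange(2000)) + 0.1 * rng.standard_normal(2000)  # toy series
W_in = rng.uniform(-0.5, 0.5, n_res)              # fixed random input weights
W = rng.standard_normal((n_res, n_res))           # fixed random reservoir weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))         # scale spectral radius below 1

x = np.zeros(n_res)
states = []
for u_t in u[:-1]:                                # drive the reservoir with the input
    x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u_t)
    states.append(x.copy())
S = np.array(states[200:])                        # drop the initial transient (washout)
y = u[201:]                                       # one-step-ahead targets

# Train only the readout, by ridge regression: W_out = (S'S + r I)^{-1} S'y
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
pred = S @ W_out
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```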